Chinese Operatives Use AI-Generated Images to Spread Disinformation Ahead of 2024 US Election, Microsoft Warns

Microsoft analysts reveal how suspected Chinese operatives are leveraging artificial intelligence to manipulate American voters and stoke divisive political debate online.
Introduction

As the 2024 US election looms, Microsoft analysts have issued a warning about suspected Chinese operatives using artificial intelligence (AI) to disseminate disinformation and spark discussion of divisive political issues. Over the past nine months, these operatives have circulated AI-generated images featuring the Statue of Liberty and the Black Lives Matter movement to denigrate US political figures and symbols. Microsoft's findings shed light on the growing concern that foreign adversaries are exploiting AI to manipulate American voters and amplify an already polarized domestic information environment.
AI-Generated Images Fuel Chinese Influence Network's Campaign
Microsoft's investigation found that the alleged Chinese influence network used a series of accounts on Western social media platforms to share AI-generated images. The images were designed to mimic real American voters and were spread further by real people, knowingly or unknowingly, through reposts on social media. Microsoft added that the accounts were affiliated with the Chinese Communist Party, pointing to the involvement of state actors in the disinformation campaign.
The Threat of AI-Driven Disinformation in US Elections
The potential for adversaries to exploit AI technology to sow disinformation among US voters is a significant concern for election officials, particularly ahead of a potentially contentious 2024 rematch between President Joe Biden and former President Donald Trump. With 69% of Republicans and Republican-leaners still expressing doubts about the legitimacy of Biden's 2020 win, the information environment is already ripe for manipulation. Microsoft's Clint Watts emphasized that China is likely to keep refining the technology to improve its accuracy, leaving open the question of when and how it will be deployed at scale.
AI-Made Images Garner High Engagement on Social Media
According to Microsoft, the AI-generated images shared by the Chinese influence network have drawn higher engagement from authentic social media users than the network's previous posts. Although Microsoft did not provide specific metrics, the finding underscores how effective AI-driven disinformation campaigns can be at capturing public attention and perpetuating divisive narratives.
Chinese Government's Response and Previous Instances of Influence Operations
CNN has reached out to the Chinese Embassy in Washington, DC, for comment on the Microsoft report. The Chinese government has consistently denied allegations that it uses hacking or disinformation to interfere in US affairs. Evidence of Chinese influence operations aimed at sowing discord in the US has nonetheless surfaced in recent months. In July, researchers at security firm Mandiant revealed that pro-Beijing operatives had paid unwitting Americans to protest racial inequality and a US ban on goods from Xinjiang. Meta, the parent company of Facebook, also recently dismantled a large covert influence operation involving thousands of China-based social media accounts that targeted audiences in the US, Taiwan, and elsewhere.

Conclusion

Microsoft's warning that suspected Chinese operatives are using AI-generated images to spread disinformation and provoke debate ahead of the 2024 US election underscores the evolving landscape of information warfare. As foreign adversaries continue to exploit emerging technologies, election officials and social media platforms face the challenge of identifying and mitigating AI-driven disinformation campaigns. The findings also highlight the urgent need for public awareness and critical thinking to counter the manipulation of online discourse and safeguard the integrity of democratic processes.